
    Development of a simple information pump.

    The Information Pump (IP) is a methodology that aims to counter the problems arising from traditional subjective product data collection. The IP is a game-theory-based process that aims to maximise the information extracted from a panel of subjects while maintaining their interest in the process through a continuous panelist scoring method. The challenge in implementing it arises from the difficulty of executing the 'game'. In its original format, there is an assumption that each player uses their own PC to interact with the process. While this in theory allows information and scores to flow in a controlled manner between the players, in practice it is a major barrier to wider adoption of the IP method. The barrier is two-fold: the set-up is costly and complex, and it is not a natural way to exchange information. The core objective is to develop a low-cost version of the IP method that retains the game theory approach to maintain interest among participants and maximise information extraction, but removes the need for each participant to have their own PC interface to the game. This requires replacing both the inter-player communication method and the score-keeping/reporting mechanism.
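
    As a rough illustration of the intended low-cost format, the sketch below shows how a single moderator could run a round-based scoring loop without per-player PCs. The data structures and the novelty-based scoring rule are illustrative assumptions, not the published IP protocol.

```python
# A minimal sketch of a moderator-run Information Pump round, assuming a
# single shared interface instead of per-player PCs. The scoring rule is
# a hypothetical placeholder, not the published IP game rules.
from dataclasses import dataclass

@dataclass
class Panelist:
    name: str
    score: float = 0.0

def play_round(panelists, statements, rate_novelty):
    """One round: each panelist offers one statement about the product;
    the moderator rates its novelty in [0, 1] and updates the running
    scores that keep the game competitive."""
    for p, statement in zip(panelists, statements):
        p.score += rate_novelty(statement)
    # Scores are announced aloud after each round rather than displayed
    # on individual screens.
    for p in sorted(panelists, key=lambda x: -x.score):
        print(f"{p.name}: {p.score:.2f}")
```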

    Multiple mitochondrial introgression events and heteroplasmy in Trypanosoma cruzi revealed by maxicircle MLST and next-generation sequencing

    Background: Mitochondrial DNA is a valuable taxonomic marker due to its relatively fast rate of evolution. In Trypanosoma cruzi, the causative agent of Chagas disease, the mitochondrial genome has a unique structural organization consisting of 20–50 maxicircles (∼20 kb) and thousands of minicircles (0.5–10 kb). T. cruzi is an early diverging protist displaying remarkable genetic heterogeneity and is recognized as a complex of six discrete typing units (DTUs). The majority of infected humans are asymptomatic for life, while 30–35% develop potentially fatal cardiac and/or digestive syndromes. However, the relationship between specific clinical outcomes and T. cruzi genotype remains elusive. The availability of whole genome sequences has driven advances in high-resolution genotyping techniques and re-invigorated interest in exploring the diversity present within the various DTUs. Methodology/Principal Findings: To describe intra-DTU diversity, we developed a high-resolution maxicircle multilocus sequence typing (mtMLST) scheme based on ten gene fragments. A panel of 32 TcI isolates was genotyped using the mtMLST scheme, GPI, mini-exon and 25 microsatellite loci. Comparison of nuclear and mitochondrial data revealed clearly incongruent phylogenetic histories among different geographical populations as well as major DTUs. In parallel, we exploited read depth data, generated by Illumina sequencing of the maxicircle genome from the TcI reference strain Sylvio X10/1, to provide the first evidence of mitochondrial heteroplasmy (heterogeneous mitochondrial genomes in an individual cell) in T. cruzi. Conclusions/Significance: mtMLST provides a powerful approach to genotyping at the sub-DTU level. This strategy will facilitate attempts to resolve phenotypic variation in T. cruzi and to address epidemiologically important hypotheses in conjunction with intensive spatio-temporal sampling. The observations of both general and specific incidences of nuclear-mitochondrial phylogenetic incongruence indicate that genetic recombination is geographically widespread and continues to influence the natural population structure of TcI, a conclusion which challenges the traditional paradigm of clonality in T. cruzi.
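
    To make the two analyses concrete, the sketch below shows (a) collapsing per-fragment allele IDs into an mtMLST sequence type and (b) flagging putative heteroplasmic sites from per-base read counts. Thresholds and data layouts are illustrative assumptions, not the study's actual pipeline.

```python
# A sketch of the two analyses described above: assigning mtMLST sequence
# types from allele profiles, and flagging putative heteroplasmic sites
# from per-base read counts. Thresholds and data layouts are illustrative
# assumptions, not the study's pipeline.

def sequence_type(allele_ids, known_profiles):
    """Map a tuple of allele IDs (one per maxicircle gene fragment; ten
    in the scheme above) to a sequence type, assigning a new ST number
    to profiles not seen before."""
    profile = tuple(allele_ids)
    if profile not in known_profiles:
        known_profiles[profile] = len(known_profiles) + 1
    return known_profiles[profile]

def heteroplasmic_sites(pileup, min_depth=50, min_minor_freq=0.05):
    """pileup: iterable of (position, {base: count}) from mapped reads.
    A site is flagged when the second most common base exceeds a minor
    allele frequency threshold at adequate depth, consistent with mixed
    mitochondrial genomes within a single cell."""
    flagged = []
    for pos, counts in pileup:
        depth = sum(counts.values())
        if depth < min_depth:
            continue
        top_two = sorted(counts.values(), reverse=True)[:2]
        if len(top_two) == 2 and top_two[1] / depth >= min_minor_freq:
            flagged.append(pos)
    return flagged
```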

    Motives of some Fano varieties

    We study the Fano varieties of projective k-planes lying in hypersurfaces and investigate the associated motives.
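
    For context, the varieties in question are the Fano schemes of k-planes, defined inside the Grassmannian; this is the standard definition, included here for orientation rather than taken from the paper:

```latex
% Fano variety of projective k-planes contained in a hypersurface
% X \subset \mathbb{P}^n (standard definition, for context only):
F_k(X) = \{\, \Lambda \in \mathbb{G}(k, n) : \Lambda \subseteq X \,\}
       \subseteq \mathbb{G}(k, n)
```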

    Semi-supervised prediction of protein interaction sentences exploiting semantically encoded metrics

    Protein-protein interaction (PPI) identification is an integral component of many biomedical research and database curation tools. Automation of this task through classification is one of the key goals of text mining (TM). However, the labelled PPI corpora required to train classifiers are generally small. To overcome this sparsity in the training data, we propose a novel method of integrating corpora that do not contain relevance judgements. Our approach uses a semantic language model to gather word similarity from a large unlabelled corpus. This additional information is integrated into the sentence classification process using kernel transformations and has a re-weighting effect on the training features that leads to an 8% improvement in F-score over the baseline results. Furthermore, we discover that some words which are generally considered indicative of interactions are actually neutralised by this process.
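
    The sketch below illustrates the general idea of a semantic kernel transformation: a word-similarity matrix learned from unlabelled text re-weights bag-of-words features before classification. Names and the similarity source are illustrative assumptions, not the paper's exact method.

```python
# A minimal sketch of semantic smoothing via a kernel transformation:
# a word-similarity matrix S, gathered from a large unlabelled corpus,
# re-weights bag-of-words sentence vectors so that K(x, y) = (xS)(yS)^T.
# Matrix names and the similarity source are illustrative assumptions.
import numpy as np

def semantic_gram(X, S):
    """X: (n_sentences, vocab) term counts; S: (vocab, vocab) symmetric
    word-similarity matrix. Returns the Gram matrix of the smoothed
    sentence vectors for use with a kernel classifier."""
    Xs = X @ S            # spread each word's weight onto similar words
    return Xs @ Xs.T      # linear kernel in the smoothed feature space

# Usage with a precomputed-kernel SVM (scikit-learn):
#   from sklearn.svm import SVC
#   clf = SVC(kernel="precomputed").fit(semantic_gram(X_train, S), y_train)
```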

    Gene expression time delays & Turing pattern formation systems

    The incorporation of time delays can greatly affect the behaviour of partial differential equations and dynamical systems. In addition, there is evidence that time delays in gene expression due to transcription and translation play an important role in the dynamics of cellular systems. In this paper, we investigate the effects of incorporating gene expression time delays into a putative one-dimensional reaction-diffusion pattern formation mechanism on both stationary domains and domains with spatially uniform exponential growth. While oscillatory behaviour is rare, we find that the time taken to initiate and stabilise patterns increases dramatically as the time delay is increased. In addition, we observe that on rapidly growing domains the time delay can induce a failure of the Turing instability which cannot be predicted by a naive linear analysis of the underlying equations about the homogeneous steady state. The dramatic lag in the induction of patterning, or even its complete absence on occasions, highlights the importance of considering explicit gene expression time delays in models for cellular reaction-diffusion patterning.
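
    The sketch below illustrates the general setup: a 1D two-species reaction-diffusion system integrated with an explicit scheme, where the kinetics are evaluated at the delayed time t − τ via a history buffer. The Schnakenberg-type terms and all parameter values are stand-ins, not the paper's exact model.

```python
# A minimal sketch of a 1D reaction-diffusion system with a fixed gene
# expression delay tau in the kinetics, on a stationary periodic domain.
# Schnakenberg-type terms and parameters are illustrative stand-ins.
import numpy as np
from collections import deque

def simulate(L=50.0, nx=200, T=50.0, dt=1e-3, tau=1.0,
             Du=1.0, Dv=10.0, a=0.1, b=0.9, seed=0):
    dx = L / nx
    rng = np.random.default_rng(seed)
    u = (a + b) + 0.01 * rng.standard_normal(nx)        # perturbed steady state
    v = b / (a + b) ** 2 + 0.01 * rng.standard_normal(nx)
    lag = int(round(tau / dt))
    # Constant initial history; hist[0] always holds the fields at t - tau.
    hist = deque([(u.copy(), v.copy())] * (lag + 1), maxlen=lag + 1)
    for _ in range(int(T / dt)):
        ud, vd = hist[0]                                # delayed fields
        lap_u = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx ** 2
        lap_v = (np.roll(v, 1) - 2 * v + np.roll(v, -1)) / dx ** 2
        u = u + dt * (Du * lap_u + a - ud + ud ** 2 * vd)   # delayed kinetics
        v = v + dt * (Dv * lap_v + b - ud ** 2 * vd)
        hist.append((u.copy(), v.copy()))
    return u, v
```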

    Computational modelling of placental amino acid transfer as an integrated system

    Placental amino acid transfer is essential for fetal development and its impairment is associated with poor fetal growth. Amino acid transfer is mediated by a broad array of specific plasma membrane transporters with overlapping substrate specificity. However, it is not fully understood how these different transporters work together to mediate net flux across the placenta. Therefore, the aim of this study was to develop a new computational model to describe how human placental amino acid transfer functions as an integrated system. Amino acid transfer from mother to fetus requires transport across the two plasma membranes of the placental syncytiotrophoblast, each of which contains a distinct complement of transporter proteins. A compartmental modelling approach was combined with a carrier-based modelling framework to represent the kinetics of the individual accumulative, exchange and facilitative classes of transporters on each plasma membrane. The model successfully captured the principal features of transplacental transfer. Modelling results clearly demonstrate how modulating transporter activity, and conditions such as phenylketonuria, can increase the transfer of certain groups of amino acids, but that this comes at the cost of decreasing the transfer of others, which has implications for developing clinical treatment options in the placenta and other transporting epithelia.
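
    A minimal sketch of the compartmental idea follows, assuming a single generic amino acid, one saturable carrier on each of the two syncytiotrophoblast membranes, and arbitrary parameter values; the real scheme described above couples many amino acids through shared accumulative and exchange transporters.

```python
# A minimal sketch of mother -> syncytiotrophoblast -> fetus transfer with
# a saturable (Michaelis-Menten) carrier on each membrane. One generic
# amino acid only; all parameters are illustrative assumptions.
from scipy.integrate import solve_ivp

def flux(c_from, c_to, vmax, km):
    """Net carrier-mediated flux for a facilitative transporter, modelled
    as the difference of two saturable unidirectional fluxes."""
    return vmax * (c_from / (km + c_from) - c_to / (km + c_to))

def rhs(t, y, p):
    cm, ci, cf = y                                     # maternal, intracellular, fetal
    j_mvm = flux(cm, ci, p["vmax_mvm"], p["km_mvm"])   # microvillous membrane
    j_bm = flux(ci, cf, p["vmax_bm"], p["km_bm"])      # basal membrane
    return [-j_mvm / p["v_m"], (j_mvm - j_bm) / p["v_i"], j_bm / p["v_f"]]

p = dict(vmax_mvm=1.0, km_mvm=0.2, vmax_bm=0.5, km_bm=0.2,
         v_m=10.0, v_i=1.0, v_f=5.0)                   # compartment volumes (a.u.)
sol = solve_ivp(rhs, (0.0, 100.0), [1.0, 0.1, 0.1], args=(p,))
```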

    Investigating how faculty social networks and peer influence relate to knowledge and use of evidence-based teaching practices

    Background: Calls for science education reform have been made for decades in the USA. The recent call to produce one million new science, technology, engineering, and math (STEM) graduates over 10 years highlights the need to employ evidence-based instructional practices (EBIPs) in undergraduate STEM classes to create engaging and effective learning environments. EBIPs are teaching strategies that have been empirically demonstrated to positively impact student learning, attitudes, and achievement in STEM disciplines. However, the mechanisms and processes by which faculty learn about and choose to implement EBIPs remain unclear. To explore this problem area, we used social network analysis to examine how an instructor’s knowledge and use of EBIPs may be influenced by their peers within a STEM department. We investigated teaching discussion networks in biology and chemistry departments at three public universities. Results: We report that tie strength and tie diversity vary between departments, but that mean indegree is not correlated with organizational rank or tenure status. We also find that teaching discussion ties can often be characterized as strong ties based on two measures of tie strength. Further, we compare peer influence models and find consistent evidence that peer influence in these departments follows a network disturbances model. Conclusions: Our findings with respect to tie strength and tie diversity indicate that the social network structures in these departments vary in how conducive they might be to change. The correlation in teaching practice between discussion partners, together with the peer influence models, suggests that change agents should consider local social network characteristics when developing change strategies. In particular, change agents can expect that faculty may serve as opinion leaders regardless of their academic rank, and that faculty can increase their use of EBIPs even if those they speak to about teaching use EBIPs comparatively less.
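
    As a sketch of the modelling approach named above, the code below profiles the autocorrelation parameter of a network disturbances (network autocorrelation) model, y = Xβ + ε with ε = ρWε + ν. The data layout, predictors, and grid-search fit are illustrative assumptions, not the study's estimation procedure.

```python
# A hedged sketch of fitting a network disturbances model by profiling
# the concentrated log-likelihood over a grid of rho values. W is a
# row-normalised "discusses teaching with" adjacency matrix; predictors
# such as indegree and tenure are illustrative assumptions.
import numpy as np

def fit_network_disturbances(y, X, W, rhos=np.linspace(-0.9, 0.9, 181)):
    """y = X @ beta + e, e = rho * W @ e + nu. Returns (rho_hat, beta_hat)."""
    n = len(y)
    best = None
    for rho in rhos:
        A = np.eye(n) - rho * W                    # spatial filter on the errors
        ys, Xs = A @ y, A @ X
        beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        resid = ys - Xs @ beta
        sigma2 = resid @ resid / n
        _, logdet = np.linalg.slogdet(A)
        ll = logdet - n / 2 * np.log(sigma2)       # concentrated log-likelihood
        if best is None or ll > best[0]:
            best = (ll, rho, beta)
    return best[1], best[2]
```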

    Calibration of cognitive tests to address the reliability paradox for decision-conflict tasks

    Standard, well-established cognitive tasks that produce reliable effects in group comparisons also lead to unreliable measurement when assessing individual differences. This reliability paradox has been demonstrated in decision-conflict tasks such as the Simon, Flanker, and Stroop tasks, which measure various aspects of cognitive control. We aim to address this paradox by implementing carefully calibrated versions of the standard tests with an additional manipulation to encourage processing of conflicting information, as well as combinations of standard tasks. Over five experiments, we show that a Flanker task and a combined Simon and Stroop task with the additional manipulation produced reliable estimates of individual differences in under 100 trials per task, which improves on the reliability seen in benchmark Flanker, Simon, and Stroop data. We make these tasks freely available and discuss both theoretical and applied implications regarding how the cognitive testing of individual differences is carried out.
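
    One common way to quantify the individual-difference reliability discussed above is a permutation-based split-half correlation of the conflict effect (incongruent minus congruent response time) with the Spearman-Brown correction. The sketch below assumes trial-level arrays and is illustrative, not the paper's analysis code.

```python
# A minimal sketch of permutation-based split-half reliability for a
# conflict effect. Assumes every subject has both congruent and
# incongruent trials in each random half; array names are illustrative.
import numpy as np

def split_half_reliability(rt, congruent, subject, n_splits=200, seed=0):
    """rt: response times; congruent: bool array; subject: per-trial IDs.
    Returns the mean Spearman-Brown-corrected split-half correlation of
    each subject's conflict effect (incongruent minus congruent RT)."""
    rng = np.random.default_rng(seed)
    subjects = np.unique(subject)
    rs = []
    for _ in range(n_splits):
        half = rng.integers(0, 2, size=len(rt))     # random trial split
        effects = np.empty((len(subjects), 2))
        for i, s in enumerate(subjects):
            for h in (0, 1):
                m = (subject == s) & (half == h)
                effects[i, h] = (rt[m & ~congruent].mean()
                                 - rt[m & congruent].mean())
        r = np.corrcoef(effects[:, 0], effects[:, 1])[0, 1]
        rs.append(2 * r / (1 + r))                  # Spearman-Brown correction
    return float(np.mean(rs))
```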

    The failed liberalisation of Algeria and the international context: a legacy of stable authoritarianism

    The paper challenges the somewhat marginal role assigned to international factors in the study of transitions to democracy. Theoretical and practical difficulties in proving causal mechanisms between international variables and domestic outcomes can be overcome by defining the international dimension in terms of Western dominance of world politics and by identifying Western actions towards democratising countries. The paper focuses on the case of Algeria, where international factors are key to explaining both the initial process of democratisation and its subsequent demise. In particular, the paper argues that direct Western policies, the pressures of the international system and external shocks influence the internal distribution of power and resources, which underpins the different strategies of all domestic actors. The paper concludes that analysis based purely on domestic factors cannot explain the process of democratisation, and that international variables must be taken more seriously into account and examined in much more detail.